Adaptive user interface system
Patent abstract:
An adaptive user interface system (2) for a vehicle (3), configured to present information at an information display (4) of said vehicle. The system comprises a control unit (6) for controlling the information presented at said information display (4), and said information display comprises a plurality of information areas (8); each information area has a predefined position at said information display and is configured for presentation of information area content. The system comprises an eye tracking arrangement (10) configured to determine at least one predefined information area parameter (12) related to a user's visual interaction (14) with information areas (8), and to generate an eye tracking signal (16), including said at least one parameter, to be applied to said control unit (6). The control unit (6) is configured to store the determined information area parameter(s) for each information area, to analyse said information area parameters over time using a set of analyse rules, to determine one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and to control said information display to present information area content of active information areas. (Figure 2)

Publication number: SE1451415A1
Application number: SE1451415
Filing date: 2014-11-24
Publication date: 2016-05-25
Inventors: Jonatan Fjellström; Simon Katzman; Johanna Vännström; Robert Friberg; Stas Krupenia; Victor Ahlm; Helena Nyberg; Daniele Nicola
Applicant: Scania CV AB
Patent description:
Adaptive user interface system

Field of the invention

The present disclosure relates to an adaptive user interface system for a vehicle, and a method in connection with such a system, according to the preambles of the independent claims.

Background of the invention

In modern vehicles the amount of information presented to drivers increases as the number of new help systems integrated in the vehicles grows. This may result in an increasing mental workload for the drivers. Some drivers want as much information as they can get, and some drivers only need some basic information. Different drivers have different strategies for solving the task of driving the vehicle. Therefore, different drivers require different information, and different amounts of information, to feel secure and comfortable.

Information is normally presented at information displays arranged in front of the driver and below the windscreen.

An alternative way of presenting information to a driver is by using so-called heads-up displays (HUD). A HUD projects information onto the windshield of an automobile, allowing a driver to view the projected information without having to look down at an instrument panel. Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles, computer gaming, and other applications.

A HUD is any transparent display that presents data without requiring users to look away from their usual viewpoints. The origin of the name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments.

US 2007/0194902 relates to an adaptive heads-up user interface for automobiles. The interface comprises a number of display elements that may be presented in a variety of display states, which are determined based on inputs from a variety of sources, e.g. from the vehicle, or based on user interaction and biometric information.
As discussed above, the amount of information that some drivers would like to see may be large, and if presented at the windscreen it can have negative effects on the driver's driving capabilities, as it may obscure the driver's view.

One overall object of the present invention is to take into account a driver's personal way of choosing which information he/she would like to see, and thereby achieve a system and a method for personalizing the information presented to a driver.

Summary of the invention

The above-mentioned object is achieved, or at least mitigated, by the present invention according to the independent claims. Preferred embodiments are set forth in the dependent claims.

According to the present invention an adaptive user interface system is achieved having capabilities of remembering when the driver chooses to look at a certain piece of information. The system is configured to log the position the driver is looking at by using an eye tracker technique including an integrated so-called gaze interaction. The system adapts the presented information in accordance with the result of an analysis of parameters related to the information areas the driver has been looking at. The adaptation relates both to which information should be presented and to when the information should be presented.

In one embodiment the information to the driver is presented at the windscreen, i.e. the information is presented at a heads-up display (HUD).

One advantage of the present invention is that only information wanted by the driver is presented.

Short description of the appended drawings

Figure 1 is a schematic illustration of a vehicle comprising the adaptive user interface system according to the present invention.

Figure 2 is a block diagram schematically illustrating an adaptive user interface system according to the present invention.

Figure 3 is a flow diagram illustrating the method according to the present invention.
Detailed description of preferred embodiments of the invention

The present invention will now be disclosed in detail with references to the appended figures.

The present invention relates to an adaptive user interface system 2 applicable for use in e.g. a vehicle 3, which is shown in the schematic illustration in figure 1. In the figure a display 4, a control unit 6 and an eye tracking arrangement 10 are schematically illustrated.

Thus, with references to figure 2, the present invention relates to an adaptive user interface system 2 for a vehicle 3 (see figure 1), e.g. a bus, a truck, or an automobile, but also for aircraft and boats.

The system is configured to present information at an information display 4 of the vehicle. The system comprises a control unit 6 for controlling, via control signals 7, the information presented at the information display 4. In its turn, the control unit 6 receives the information to be presented from various systems of the vehicle.

The information display 4 comprises a plurality of information areas 8, where each information area has a predefined position at the information display and is configured for presentation of information area content. Thus, the control unit 6 is provided with data regarding the information areas, their respective positions, and also which information is presented at each information area, herein denoted the information area content. An information area may have different sizes and shapes depending on the information intended to be presented. The information area content may be the speed of the vehicle, a temperature indication, traffic warnings, information related to the surroundings, information about music played on the radio, etc.
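As a concrete illustration of this arrangement, the relation between the display, its information areas, their predefined positions and their content could be modelled roughly as below. This is a minimal sketch with illustrative names and layout values; none of it is taken from the patent disclosure itself:

```python
from dataclasses import dataclass

@dataclass
class InformationArea:
    """An information area: a predefined position (and size) at the
    information display, plus the content currently shown there."""
    area_id: str
    x: int            # predefined position at the display, in pixels (assumed unit)
    y: int
    width: int        # areas may differ in size and shape
    height: int
    content: str = "" # information area content, e.g. the vehicle speed

class InformationDisplay:
    """The information display as a collection of information areas."""
    def __init__(self, areas):
        self.areas = {a.area_id: a for a in areas}

    def set_content(self, area_id: str, content: str) -> None:
        """The control unit updating what an area presents."""
        self.areas[area_id].content = content

    def contains(self, area_id: str, px: int, py: int) -> bool:
        """True if a gaze point (px, py) falls inside the given area -
        the kind of test needed to match a gaze position to an area."""
        a = self.areas[area_id]
        return a.x <= px < a.x + a.width and a.y <= py < a.y + a.height
```

With such a model, the control unit's bookkeeping of "which information is presented at each information area" reduces to lookups in the `areas` mapping.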
The system comprises an eye tracking arrangement 10 configured to determine at least one predefined information area parameter 12 related to a user's visual interaction 14 - illustrated by a schematic illustration of a user's eye 15 - with information areas 8, and to generate an eye tracking signal 16, including the at least one parameter, to be applied to the control unit 6. The control unit 6 is configured to control the function of the eye tracking arrangement 10 by control signals 17. Below, under a separate heading, different aspects of the eye tracking technique will be discussed in detail, where specific issues relating to the use of the technique in vehicles when implementing the present invention will also be discussed. The user's visual interaction 14 includes both determining the position of an information area that the user is looking at and, in addition, a possibility of activating a function controlled by an information area content in an information area that the user is looking at, i.e. gaze interaction.

The at least one predefined information area parameter includes e.g. a parameter stating the time duration of a user fixation of an information area.

The control unit 6 is configured to store the determined information area parameter(s) for each information area, and to analyse the information area parameters over time using a set of analyse rules.

The control unit is further configured to determine one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and to control the information display, by control signals 7, to present information area content of active information areas.

According to one embodiment the set of analyse rules comprises a rule to perform a frequency analysis of user interaction with information areas. The frequency analysis may comprise determining the number of times a fixation is made at a specific information area during a predetermined time period.
A fixation is defined as viewing a specific information area for a time period with a duration at least in the range of 200-350 ms.

The analysis is preferably performed continuously and may include interaction information from a predetermined time period, e.g. some hours, or from an entire driving session.

An information area may be regarded as active if the number of fixations of that information area is above a predetermined threshold during the predetermined time period. Thus, a fixed number of fixations may be required in order to qualify as an active information area. As an alternative, the predetermined threshold may be set such that the active information areas are e.g. the "top ten" areas having the most fixations.

Thus, the presentation rules include a rule to determine an information area as active in dependence of the result of said analysing step.

According to one embodiment the set of presentation rules includes at least one rule including vehicle related parameters, e.g. the speed of the vehicle. This presentation rule may e.g. state that the number of active information areas may be higher when driving at a lower speed, for the reason that at a lower speed the driver may have time to take more information into account than when driving at a higher speed, when only a few information areas should be shown.

As an alternative, when driving at higher speed, e.g. on a motorway, there is less distraction compared to driving in more complex environments, e.g. in towns where the driver's full attention is required, and therefore more information areas may be active. In that case the speed, in combination with e.g. driving in an essentially straight direction, may be taken into account.

Also, more advanced functionality may be provided. For example, the system could be smart enough to remember that the driver, e.g. ten minutes prior to each stop, would like to check where to eat. That information will then always emerge before it is time to stop.
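The frequency analysis and the two activation rules described above (a fixed fixation-count threshold, or the "top ten" areas by fixation count) can be sketched as follows. The function names, the event representation and the numeric values are illustrative assumptions, not taken from the patent:

```python
from collections import Counter

def count_fixations(events, window_start, window_end):
    """Frequency analysis: count fixations per information area
    within a predetermined time period (the analysis window).
    `events` is an iterable of (area_id, fixation_time) pairs."""
    counts = Counter()
    for area_id, timestamp in events:
        if window_start <= timestamp <= window_end:
            counts[area_id] += 1
    return counts

def active_by_threshold(counts, threshold):
    """Rule 1: an area is active if its fixation count in the
    window exceeds a predetermined threshold."""
    return {area for area, n in counts.items() if n > threshold}

def active_top_n(counts, n=10):
    """Rule 2 (alternative): the n areas with the most fixations
    are active - the 'top ten' variant."""
    return {area for area, _ in counts.most_common(n)}
```

Running the analysis continuously, as the text suggests, would amount to re-evaluating these functions over a sliding window covering e.g. the last few hours or the whole driving session.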
Thus, the present invention provides for relating and/or adapting the presented information to the situation, i.e. relating the piece of information to the situation and also adapting how the information is presented, e.g. specifically highlighting the information if the degree of importance is high. E.g. at a red light stop (the speed is low/zero) it would be possible to present a certain type of information.

The control unit is preferably configured to store a user's set of adaptive presentation rules when a user session is terminated, and to use that set of presentation rules the next time the same user uses the system. Thereby the presentation of information is personalized. The driver is normally identified by his/her driver card.

According to one particular embodiment the information display is a so-called heads-up display (HUD) at a windscreen of the vehicle. The HUD technology will be further discussed below under a separate heading.

As an alternative, the information display is an LCD, or another type of display, arranged in front of the driver below the windscreen.

The present invention also relates to a method in an adaptive user interface system for a vehicle. The adaptive interface system has been described in detail above, and reference is herein made to that description. Thus, the interface system is configured to present information at an information display of the vehicle. The system comprises a control unit for controlling the information presented at the information display, and the information display comprises a plurality of information areas; each information area has a predefined position at the information display and is configured for presentation of information area content.

With references to the flow diagram shown in figure 3, the method will now be described in more detail.
The method comprises determining, by an eye tracking arrangement, at least one predefined information area parameter related to a user's visual interaction with information areas, generating an eye tracking signal including the parameter(s), and storing, in the control unit, the determined information area parameter(s) for each information area.

The control unit then performs the step of analysing the information area parameters over time using a set of analyse rules, e.g. by performing a frequency analysis of user interaction with information areas, by a frequency analysis rule in the set of analyse rules.

Furthermore, the method comprises determining one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and presenting, at the information display, information area content of active information areas. More in detail, the presentation rules provide for determining an information area as active in dependence of the result of the analysing step.

The set of presentation rules may e.g. include one or many rules including vehicle related parameters, e.g. the speed of the vehicle.

The frequency analysis rule comprises determining the number of times a fixation is made at a specific information area during a predetermined time period. An information area is regarded as active if the number of fixations of that information area is above a predetermined threshold during a predetermined time period. The predefined information area parameters include a parameter stating the time duration of a user fixation of an information area. These aspects are further discussed above in the description of the system.

According to one embodiment the method comprises storing a user's set of adaptive presentation rules when a user session is terminated, and using that set of presentation rules the next time the same user uses the system.

As an output from the eye tracking arrangement, an attention level of the driver may be available.
This attention level may include a fatigue level of the driver, and in that case it might be necessary to "wake" the driver up by e.g. playing music louder or by a visual indication at the windscreen (HUD).

The attention level may also include information about the stress level of the driver; e.g. if the traffic situation is intense it may be necessary to present less information, or a certain type of information, to assist the driver in the best manner.

The present invention also relates to a computer program P (see figure 2) that comprises a computer program code to cause the control unit, or a computer connected to the control unit, to perform the method described above. In addition, a computer program product is provided, comprising a computer program code stored on a computer-readable medium to perform the method described above when the computer program code is executed by the control unit or by a computer connected to the control unit.

Heads-up display (HUD)

A typical HUD comprises three primary components: a projector unit, a combiner, and a video generation computer - herein included in the control unit 6.

The projection unit in a typical HUD is an optical collimator setup: a convex lens or concave mirror with a cathode ray tube, light emitting diode, or liquid crystal display at its focus. This setup produces an image where the light is parallel, i.e. perceived to be at infinity.

The combiner is typically an angled flat piece of glass (a beam splitter) located directly in front of the viewer, which redirects the projected image from the projector in such a way that the field of view and the projected infinity image are seen at the same time. Combiners may have special coatings that reflect the monochromatic light projected onto them from the projector unit while allowing all other wavelengths of light to pass through. In some optical layouts combiners may also have a curved surface to refocus the image from the projector.

The computer provides the interface between the HUD (i.e.
the projection unit) and the systems/data to be displayed, and generates the imagery and symbology to be displayed by the projection unit.

Newer micro-display imaging technologies are being introduced, including liquid crystal display (LCD), liquid crystal on silicon (LCoS), digital micro-mirrors (DMD), and organic light-emitting diodes (OLED).

Eye-tracker technique

An eye tracker system incorporates near-infrared micro-projectors, optical sensors and image processing. Micro-projectors create reflection patterns on the eyes. Image sensors register the image of the user, the user's eyes, and the projection patterns, in real time. Image processing is used to find features of the user, the eyes and the projection patterns. Mathematical models are then used to exactly calculate the eyes' position and the gaze point.

Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in cognitive linguistics and in product design. There are a number of methods for measuring eye movement. The most popular variant uses video images from which the eye position is extracted. Other methods use search coils or are based on the electro-oculogram.

A second broad category uses some non-contact, optical method for measuring eye motion. Light, typically infrared, is reflected from the eye and sensed by a video camera or some other specially designed optical sensor. The information is then analysed to extract eye rotation from changes in reflections. Video-based eye trackers typically use the corneal reflection (the first Purkinje image) and the centre of the pupil as features to track over time. A more sensitive type of eye tracker, the dual-Purkinje eye tracker, uses reflections from the front of the cornea (first Purkinje image) and the back of the lens (fourth Purkinje image) as features to track.
A still more sensitive method of tracking is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. Optical methods, particularly those based on video recording, are widely used for gaze tracking and are favoured for being non-invasive and inexpensive.

The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records their movement as the viewer looks at some kind of stimulus. Most modern eye trackers use the centre of the pupil and infrared / near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil centre and the corneal reflections can be used to compute the point of regard on a surface, or the gaze direction. A simple calibration procedure for the individual is usually needed before using the eye tracker.

Two general types of eye tracking techniques are used: bright-pupil and dark-pupil. The difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, the eye acts as a retroreflector as the light reflects off the retina, creating a bright-pupil effect similar to red eye. If the illumination source is offset from the optical path, the pupil appears dark because the retroreflection from the retina is directed away from the camera.

Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentations, and greatly reduces interference caused by eyelashes and other obscuring features. It also allows tracking in lighting conditions ranging from total darkness to very bright. But bright-pupil techniques are not effective for tracking outdoors, as extraneous IR sources interfere with monitoring.

Eye-tracking setups vary greatly; some are head-mounted, some require the head to be stable (for example, with a chin rest), and some function remotely and automatically track the head during motion.
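The pupil-centre/corneal-reflection (PCCR) vector and the calibration procedure mentioned above can be sketched as a least-squares fit of a mapping from PCCR vectors to display coordinates, estimated while the user fixates known calibration points. This is a deliberately simplified illustration - real calibrations typically use higher-order polynomial models and per-eye parameters - and all names are assumptions:

```python
import numpy as np

def fit_pccr_mapping(vectors, points):
    """Fit an affine mapping from pupil-centre/corneal-reflection
    (PCCR) vectors to display coordinates, from calibration samples
    taken while the user fixates known points. Returns a 3x2 matrix
    M such that [vx, vy, 1] @ M approximates [x, y]."""
    v = np.asarray(vectors, dtype=float)
    p = np.asarray(points, dtype=float)
    A = np.hstack([v, np.ones((len(v), 1))])  # homogeneous coordinates
    M, *_ = np.linalg.lstsq(A, p, rcond=None)
    return M

def gaze_point(M, vx, vy):
    """Map one PCCR vector to a point of regard on the display."""
    return np.array([vx, vy, 1.0]) @ M
```

Once fitted, each incoming PCCR vector yields a gaze point that can be matched against the predefined positions of the information areas.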
Most use a sampling rate of at least 30 Hz. Although 50/60 Hz is most common, today many video-based eye trackers run at 240, 350 or even 1000/1250 Hz, which is needed in order to capture the details of very rapid eye movements, e.g. during reading or in studies of neurology.

Eye movement is typically divided into fixations and saccades - when the eye gaze pauses in a certain position, and when it moves to another position, respectively. The resulting series of fixations and saccades is called a scanpath. Most information from the eye is made available during a fixation, but not during a saccade. The central one or two degrees of the visual angle (the fovea) provide the bulk of the visual information; the input from larger eccentricities (the periphery) is less informative. Hence, the locations of fixations along a scanpath show what information loci on the stimulus were processed during an eye tracking session. On average, fixations last for around 200 ms during the reading of linguistic text, and 350 ms during the viewing of a scene. Preparing a saccade towards a new goal takes around 200 ms.

Scanpaths are useful for analyzing cognitive intent, interest, and salience. Other biological factors (some as simple as gender) may affect the scanpath as well. Eye tracking in human-computer interaction (HCI) typically investigates the scanpath for usability purposes, or as a method of input in gaze-contingent displays, also known as gaze-based interfaces.

Below is a list of different aspects of eye tracking to be taken into account when implementing the adaptive user interface system according to the present invention, and more specifically to achieve a system that immediately and securely connects, or matches, a user's interaction with a specific position at the display, i.e. a specific information area, and the information presented at that area, the information area content - and then analyses the stored data to adapt the information to be presented.
- Gaze direction and gaze point - used in interaction with computers and other interfaces, and in behavioural research/human response testing to better understand what attracts people's attention.

- Eye-presence detection - the eye-tracking system must first find the eyes, so this is the most fundamental part of eye tracking.

- Eye position - the ability to calculate the position of the eyes in real time makes the eye tracking system accurate and precise while allowing the user to move freely.

- User identification - the eye tracking system can be used as a multimodal biometrics sensor, such as for logging on to a computer or for car-driver identification. It can combine face identification with physiological eye features and eye movement patterns.

- Eyelid closure - used to monitor the user's sleepiness, for instance in advanced driver assistance or operator safety solutions.

- Eye movement and patterns - studied to understand human behaviour and to assess and diagnose injuries or diseases.

- Pupil size and pupil dilation - pupil dilation is an indicator of excitement. In combination with eye movement patterns and facial expressions, it can be used to derive emotional reactions, for instance in creating innovative user experiences. Pupil dilation can also serve as a marker of impairment, such as concussion, or drug or alcohol influence.

The present invention is not limited to the above-described preferred embodiments. Various alternatives, modifications and equivalents may be used. Therefore, the above embodiments should not be taken as limiting the scope of the invention, which is defined by the appended claims.
Claims:
Claims (22)

[1] 1. An adaptive user interface system (2) for a vehicle (3), configured to present information at an information display (4) of said vehicle, the system comprises a control unit (6) for controlling the information presented at said information display (4), and said information display comprises a plurality of information areas (8), each information area has a predefined position at said information display and is configured for presentation of information area content, c h a r a c t e r i z e d in that said system comprises an eye tracking arrangement (10) configured to determine at least one predefined information area parameter (12) related to a user's visual interaction (14) with information areas (8) and to generate an eye tracking signal (16) to be applied to said control unit (6) including said at least one parameter, and that said control unit (6) is configured to store said determined information area parameter(s) for each information area, and to analyse said information area parameters over time using a set of analyse rules, the control unit is further configured to determine one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and to control said information display to present information area content of active information areas.

[2] 2. The adaptive user interface system according to claim 1, wherein said set of analyse rules comprises a rule to perform a frequency analysis of user interaction with information areas.

[3] 3. The adaptive user interface system according to claim 2, wherein said frequency analysis comprises determining a number of times a fixation is made at a specific information area during a predetermined time period.

[4] 4. The adaptive user interface system according to any of claims 1-3, wherein said presentation rules include a rule to determine an information area as active in dependence of the result of said analysing step.

[5] 5.
The adaptive user interface system according to any of claims 1-4, wherein an information area is active if a number of fixations of that information area are above a predetermined threshold during a predetermined time period.

[6] 6. The adaptive user interface system according to any of claims 1-5, wherein said predefined information area parameters include a parameter stating the time duration of a user fixation of an information area.

[7] 7. The adaptive user interface system according to any of claims 1-6, wherein said set of presentation rules include a rule including vehicle related parameters.

[8] 8. The adaptive user interface system according to any of claims 1-7, wherein said information display is a heads-up display (HUD) at a windscreen of the vehicle.

[9] 9. The adaptive user interface system according to any of claims 1-7, wherein said information display is an LCD, or another type of display.

[10] 10. The adaptive user interface system according to any of claims 1-9, wherein said control unit is configured to store a user's set of adaptive presentation rules when a user session is terminated, and to use that set of presentation rules the next time the same user uses the system.

[11] 11.
A method in an adaptive user interface system for a vehicle, configured to present information at an information display of said vehicle, the system comprises a control unit for controlling the information presented at said information display, and said information display comprises a plurality of information areas, each information area has a predefined position at said information display and is configured for presentation of information area content, c h a r a c t e r i z e d in that said method comprises

- determining, by an eye tracking arrangement, at least one predefined information area parameter related to a user's visual interaction with information areas,
- generating an eye tracking signal including said parameter(s),
- storing, in said control unit, said determined information area parameter(s) for each information area,
- analysing said information area parameters over time using a set of analyse rules,
- determining one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user,
- presenting, at said information display, information area content of active information areas.

[12] 12. The method according to claim 11, comprising performing frequency analysis of user interaction with information areas, by a frequency analysis rule in said set of analyse rules.

[13] 13. The method according to claim 12, comprising determining a number of times a fixation is made at a specific information area during a predetermined time period by said frequency analysis rule.

[14] 14. The method according to any of claims 11-13, comprising, by said presentation rules, determining an information area as active in dependence of the result of said analysing step.

[15] 15. The method according to any of claims 11-14, wherein an information area is active if a number of fixations of that information area are above a predetermined threshold during a predetermined time period.

[16] 16.
The method according to any of claims 11-15, wherein said predefined information area parameters include a parameter stating the time duration of a user fixation of an information area.

[17] 17. The method according to any of claims 11-16, wherein said set of presentation rules include a rule including vehicle related parameters.

[18] 18. The method according to any of claims 11-17, wherein said information display is a heads-up display (HUD) at a windscreen of the vehicle.

[19] 19. The method according to any of claims 11-18, comprising storing a user's set of adaptive presentation rules when a user session is terminated, and using that set of presentation rules the next time the same user uses the system.

[20] 20. A computer program P, wherein said computer program P comprises a computer program code to cause a control unit (10), or a computer connected to the control unit (10), to perform the method according to any of claims 11-19.

[21] 21. A computer program product comprising a computer program code stored on a computer-readable medium to perform the method according to any of claims 11-19, when the computer program code is executed by a control unit (10) or by a computer connected to the control unit (10).

[22] 22. A vehicle (3) comprising an adaptive user interface system according to any of claims 1-10.
Patent family:

Publication number | Publication date
SE539952C2 | 2018-02-06
DE102015015136A1 | 2016-05-25
Cited references:

Publication number | Filing date | Publication date | Applicant | Title
US7764247B2 | 2006-02-17 | 2010-07-27 | Microsoft Corporation | Adaptive heads-up user interface for automobiles
Priority:

Application number | Publication number | Filing date | Priority date | Title
SE1451415A | SE539952C2 | 2014-11-24 | 2014-11-24 | Adaptive user interface system for a vehicle
DE102015015136.3A | DE102015015136A1 | 2015-11-23 | 2015-11-23 | Adaptive user interface system